Multi-Stage Dantzig Selector

Authors

  • Ji Liu
  • Peter Wonka
  • Jieping Ye
Abstract

We consider the following sparse signal recovery (or feature selection) problem: given a design matrix X ∈ Rⁿˣᵐ (m ≫ n) and a noisy observation vector y ∈ Rⁿ satisfying y = Xβ∗ + ε, where ε is a noise vector following the Gaussian distribution N(0, σ²I), how do we recover the signal (or parameter vector) β∗ when the signal is sparse? The Dantzig selector has been proposed for sparse signal recovery with strong theoretical guarantees. In this paper, we propose a multi-stage Dantzig selector method, which iteratively refines the target signal β∗. We show that if X obeys a certain condition, then with large probability the difference between the solution β̂ estimated by the proposed method and the true solution β∗, measured in the ℓp norm (p ≥ 1), is bounded as ‖β̂ − β∗‖p ≤ (C(s − N)^{1/p} √(log m) + ∆)σ, where C is a constant, s is the number of nonzero entries in β∗, ∆ is independent of m and is much smaller than the first term, and N is the number of entries of β∗ larger than a certain value of order O(σ√(log m)). The proposed method improves the estimation bound of the standard Dantzig selector approximately from C s^{1/p} √(log m) σ to C (s − N)^{1/p} √(log m) σ, where the value N depends on the number of large entries in β∗. When N = s, the proposed algorithm achieves the oracle solution with high probability. In addition, with large probability, the proposed method can select the same number of correct features under a milder condition than the Dantzig selector.
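The underlying Dantzig selector, min ‖β‖₁ subject to ‖Xᵀ(y − Xβ)‖∞ ≤ λ, can be recast as a linear program by splitting β into nonnegative parts. The sketch below is illustrative only and is not the authors' multi-stage method; the variable split β = u − v and the use of SciPy's `linprog` are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Solve min ||b||_1  s.t.  ||X^T (y - X b)||_inf <= lam  as an LP.

    Split b = u - v with u, v >= 0, so ||b||_1 = sum(u) + sum(v).
    """
    n, m = X.shape
    G = X.T @ X            # m x m Gram matrix
    Xty = X.T @ y
    c = np.ones(2 * m)     # objective: sum(u) + sum(v)
    # Constraint -lam <= Xty - G(u - v) <= lam becomes two inequalities:
    #   -G u + G v <= lam - Xty   (upper side)
    #    G u - G v <= lam + Xty   (lower side)
    A_ub = np.vstack([np.hstack([-G, G]), np.hstack([G, -G])])
    b_ub = np.concatenate([lam - Xty, lam + Xty])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * m), method="highs")
    u, v = res.x[:m], res.x[m:]
    return u - v
```

With X = I the constraint decouples per coordinate and the solution reduces to soft-thresholding of y at level λ, which gives a quick sanity check of the formulation.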

Related articles

A Multi-Stage Framework for Dantzig Selector and LASSO

We consider the following sparse signal recovery (or feature selection) problem: given a design matrix X ∈ Rⁿˣᵐ (m ≫ n) and a noisy observation vector y ∈ Rⁿ satisfying y = Xβ∗ + ε, where ε is the noise vector following a Gaussian distribution N(0, σ²I), how to recover the signal (or parameter vector) β∗ when the signal is sparse? The Dantzig selector has been proposed for sparse signal recovery ...


DASSO: Connections Between the Dantzig Selector and Lasso

We propose a new algorithm, DASSO, for fitting the entire coefficient path of the Dantzig selector with a similar computational cost to the LARS algorithm that is used to compute the Lasso. DASSO efficiently constructs a piecewise linear path through a sequential simplex-like algorithm, which is remarkably similar to LARS. Comparison of the two algorithms sheds new light on the question of how ...


Dantzig selector homotopy with dynamic measurements

The Dantzig selector is a near ideal estimator for recovery of sparse signals from linear measurements in the presence of noise. It is a convex optimization problem which can be recast into a linear program (LP) for real data, and solved using some LP solver. In this paper we present an alternative approach to solve the Dantzig selector which we call “Primal Dual pursuit” or “PD pursuit”. It is...


The Double Dantzig

The Dantzig selector (Candes and Tao, 2007) is a new approach that has been proposed for performing variable selection and model fitting on linear regression models. It uses an L1 penalty to shrink the regression coefficients towards zero, in a similar fashion to the Lasso. While both the Lasso and Dantzig selector potentially do a good job of selecting the correct variables, several researcher...


Parallelism, uniqueness, and large-sample asymptotics for the Dantzig selector.

The Dantzig selector (Candès and Tao, 2007) is a popular ℓ1-regularization method for variable selection and estimation in linear regression. We present a very weak geometric condition on the observed predictors which is related to parallelism and, when satisfied, ensures the uniqueness of Dantzig selector estimators. The condition holds with probability 1, if the predictors are drawn from a co...



Journal:

Volume   Issue 

Pages  -

Publication date: 2010